Continual and One-Shot Learning Through Neural Networks with Dynamic External Memory

Authors

  • Benno Lüders
  • Mikkel Schläger
  • Aleksandra Korach
  • Sebastian Risi
Abstract

Training neural networks to quickly learn new skills without forgetting previously learned skills is an important open challenge in machine learning. A common problem for adaptive networks that can learn during their lifetime is that the weights encoding a particular task are often overridden when a new task is learned. This paper takes a step in overcoming this limitation by building on the recently proposed Evolving Neural Turing Machine (ENTM) approach. In the ENTM, neural networks are augmented with an external memory component that they can write to and read from, which allows them to store associations quickly and over long periods of time. The results in this paper demonstrate that the ENTM is able to perform one-shot learning in reinforcement learning tasks without catastrophic forgetting of previously stored associations. Additionally, we introduce a new ENTM default jump mechanism that makes it easier to find unused memory locations and therefore facilitates the evolution of continual learning networks. Our results suggest that augmenting evolving networks with an external memory component is not only a viable mechanism for adaptive behaviors in neuroevolution but also allows these networks to perform continual and one-shot learning at the same time.
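
As described above, the ENTM couples an evolved controller network with a tape-like external memory: at each time step the controller emits a write vector and head-control signals, and reads back the content under the current head position, so associations can persist independently of the connection weights. The sketch below is a minimal illustration of such a memory with a single head, a content-based jump, and a simplified "default jump" that moves the head to the first unused (all-zero) location. The class name, method names, and exact write/jump semantics are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

class ExternalMemory:
    """Tape-style external memory with a single read/write head.

    Minimal, illustrative sketch of the memory component described in the
    abstract; names and exact semantics are assumptions, not the ENTM code.
    """

    def __init__(self, vector_size=8):
        self.vector_size = vector_size
        self.tape = np.zeros((1, vector_size))  # start with one blank location
        self.head = 0

    def write(self, write_vector, interpolation):
        """Blend a controller-produced vector into the current location (interpolation in [0, 1])."""
        self.tape[self.head] = ((1.0 - interpolation) * self.tape[self.head]
                                + interpolation * np.asarray(write_vector, dtype=float))

    def read(self):
        """Return the content under the head; this is fed back to the controller next step."""
        return self.tape[self.head].copy()

    def content_jump(self, key, threshold=0.5):
        """Move the head to the stored vector closest to `key`, if it is close enough."""
        distances = np.linalg.norm(self.tape - np.asarray(key, dtype=float), axis=1)
        best = int(np.argmin(distances))
        if distances[best] <= threshold:
            self.head = best

    def default_jump(self):
        """Jump to the first unused (all-zero) location, growing the tape if none exists."""
        unused = np.flatnonzero(~self.tape.any(axis=1))
        if unused.size == 0:
            self.tape = np.vstack([self.tape, np.zeros((1, self.vector_size))])
            self.head = self.tape.shape[0] - 1
        else:
            self.head = int(unused[0])
```

Because a default jump of this kind always offers a route to a blank location, a network can store a new association without overwriting the locations that encode earlier tasks, which is the behavior the abstract refers to as continual learning without catastrophic forgetting.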

Similar articles

Continual One-Shot Learning of Hidden Spike-Patterns with Neural Network Simulation Expansion and STDP Convergence Predictions

This paper presents a constructive algorithm that achieves successful one-shot learning of hidden spike-patterns in a competitive detection task. It has previously been shown (Masquelier et al., 2008) that spike-timing-dependent plasticity (STDP) and lateral inhibition can result in neurons competitively tuned to repeating spike-patterns concealed in high rates of overall presynaptic activity. O...
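
For context, a pair-based STDP rule of the kind referenced above strengthens a synapse when a presynaptic spike arrives shortly before a postsynaptic spike and weakens it in the opposite order; combined with lateral inhibition, this lets neurons compete to become detectors of repeating patterns. The snippet below is a generic textbook-style sketch with placeholder constants, not the rule or parameters used by Masquelier et al. (2008).

```python
import math

# Placeholder STDP parameters for illustration only.
A_PLUS, A_MINUS = 0.01, 0.012     # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # time constants (ms)

def stdp_weight_change(t_pre, t_post):
    """Weight change for a single pre/post spike pair (spike times in ms)."""
    dt = t_post - t_pre
    if dt >= 0:
        return A_PLUS * math.exp(-dt / TAU_PLUS)    # pre before post: potentiate
    return -A_MINUS * math.exp(dt / TAU_MINUS)      # post before pre: depress

# Example: a presynaptic spike 5 ms before a postsynaptic spike strengthens the synapse.
print(round(stdp_weight_change(t_pre=100.0, t_post=105.0), 4))
```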

One-shot Learning with Memory-Augmented Neural Networks

Despite recent breakthroughs in the applications of deep neural networks, one setting that presents a persistent challenge is that of “one-shot learning.” Traditional gradient-based networks require a lot of data to learn, often through extensive iterative training. When new data is encountered, the models must inefficiently relearn their parameters to adequately incorporate the new information...

Modelling of Conventional and Severe Shot Peening Influence on Properties of High Carbon Steel via Artificial Neural Network

Shot peening (SP), as one of the severe plastic deformation (SPD) methods, is employed for surface modification of engineering components by improving their metallurgical and mechanical properties. Furthermore, the artificial neural network (ANN) has been widely used for prediction and optimization in different science and engineering problems over the last decade. In the present study, effects of conv...

Continual Learning through Evolvable Neural Turing Machines

Continual learning, i.e. the ability to sequentially learn tasks without catastrophic forgetting of previously learned ones, is an important open challenge in machine learning. In this paper we take a step in this direction by showing that the recently proposed Evolving Neural Turing Machine (ENTM) approach is able to perform one-shot learning in a reinforcement learning task without catastroph...

Journal:

Volume / Issue:

Pages:

Publication date: 2017